# @aws-cdk/aws-s3
Define an unencrypted S3 bucket:

```ts
new Bucket(this, 'MyFirstBucket');
```
`Bucket` constructs expose the following deploy-time attributes:

- `bucketArn` - the ARN of the bucket (i.e. `arn:aws:s3:::bucket_name`)
- `bucketName` - the name of the bucket (i.e. `bucket_name`)
- `bucketUrl` - the URL of the bucket (i.e. `https://s3.us-west-1.amazonaws.com/onlybucket`)
- `arnForObjects(...pattern)` - the ARN of an object or objects within the bucket (i.e. `arn:aws:s3:::my_corporate_bucket/exampleobject.png` or `arn:aws:s3:::my_corporate_bucket/Development/*`)
- `urlForObject(key)` - the URL of an object within the bucket (i.e. `https://s3.cn-north-1.amazonaws.com.cn/china-bucket/mykey`)

Define a KMS-encrypted bucket:
```ts
const bucket = new Bucket(this, 'MyEncryptedBucket', {
  encryption: BucketEncryption.Kms
});

// you can access the encryption key:
assert(bucket.encryptionKey instanceof kms.EncryptionKey);
```
You can also supply your own key:

```ts
const myKmsKey = new kms.EncryptionKey(this, 'MyKey');

const bucket = new Bucket(this, 'MyEncryptedBucket', {
  encryption: BucketEncryption.Kms,
  encryptionKey: myKmsKey
});

assert(bucket.encryptionKey === myKmsKey);
```
Use `BucketEncryption.ManagedKms` to use the S3 master KMS key:

```ts
const bucket = new Bucket(this, 'Buck', {
  encryption: BucketEncryption.ManagedKms
});

assert(bucket.encryptionKey == null);
```
A bucket policy will be automatically created for the bucket upon the first call to `addToResourcePolicy(statement)`:

```ts
const bucket = new Bucket(this, 'MyBucket');
bucket.addToResourcePolicy(new iam.PolicyStatement()
  .addActions('s3:GetObject')
  .addResources(bucket.arnForObjects('file.txt'))
  .addAccountRootPrincipal());
```
Most of the time, you won't have to manipulate the bucket policy directly. Instead, buckets have "grant" methods that give prepackaged sets of permissions to other resources. For example:

```ts
// named `fn` rather than `lambda` to avoid shadowing the imported `lambda` module
const fn = new lambda.Function(this, 'Lambda', { /* ... */ });

const bucket = new Bucket(this, 'MyBucket');
bucket.grantReadWrite(fn.role);
```

This will grant the Lambda's execution role permissions to read from and write to the bucket.
This package also defines an Action that allows you to use a Bucket as a source in CodePipeline:

```ts
import codepipeline = require('@aws-cdk/aws-codepipeline');
import s3 = require('@aws-cdk/aws-s3');

const sourceBucket = new s3.Bucket(this, 'MyBucket', {
  versioned: true, // a Bucket used as a source in CodePipeline must be versioned
});

const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const sourceAction = new s3.PipelineSourceAction({
  actionName: 'S3Source',
  bucket: sourceBucket,
  bucketKey: 'path/to/file.zip',
});
pipeline.addStage({
  name: 'Source',
  actions: [sourceAction],
});
```
You can also create the action from the Bucket directly:

```ts
// equivalent to the code above:
const sourceAction = sourceBucket.toCodePipelineSourceAction({
  actionName: 'S3Source',
  bucketKey: 'path/to/file.zip',
});
```
By default, the Pipeline will poll the Bucket to detect changes. You can change that behavior to use CloudWatch Events by setting the `pollForSourceChanges` property to `false` (it's `true` by default). If you do that, make sure the source Bucket is part of an AWS CloudTrail Trail - otherwise, the CloudWatch Events will not be emitted, and your Pipeline will not react to changes in the Bucket. You can do it through the CDK:

```ts
import cloudtrail = require('@aws-cdk/aws-cloudtrail');

const key = 'some/key.zip';
const trail = new cloudtrail.CloudTrail(this, 'CloudTrail');
trail.addS3EventSelector([sourceBucket.arnForObjects(key)], cloudtrail.ReadWriteType.WriteOnly);
const sourceAction = sourceBucket.toCodePipelineSourceAction({
  actionName: 'S3Source',
  bucketKey: key,
  pollForSourceChanges: false, // default: true
});
```
This package also defines an Action that allows you to use a Bucket as a deployment target in CodePipeline:

```ts
import codepipeline = require('@aws-cdk/aws-codepipeline');
import s3 = require('@aws-cdk/aws-s3');

const targetBucket = new s3.Bucket(this, 'MyBucket', {});

const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const deployAction = new s3.PipelineDeployAction({
  actionName: 'S3Deploy',
  bucket: targetBucket,
  inputArtifact: sourceAction.outputArtifact,
});
const deployStage = pipeline.addStage({
  name: 'Deploy',
  actions: [deployAction],
});
```
You can also create the action from the Bucket directly:

```ts
// equivalent to the code above:
const deployAction = targetBucket.toCodePipelineDeployAction({
  actionName: 'S3Deploy',
  extract: false, // default: true
  objectKey: 'path/in/bucket', // required if extract is false
  inputArtifact: sourceAction.outputArtifact,
});
```
To use a bucket in a different stack in the same CDK application, pass the object to the other stack:
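The cross-stack pattern described above can be sketched as follows. This is a minimal illustration, not code from the original README: the stack names, the props interface, and the `cdk.App`/`cdk.Stack` wiring are assumptions based on the CDK 0.x API used elsewhere in this document.

```ts
import cdk = require('@aws-cdk/cdk');
import s3 = require('@aws-cdk/aws-s3');

// The stack that owns the bucket exposes it as a public property.
class ProducerStack extends cdk.Stack {
  public readonly bucket: s3.Bucket;

  constructor(scope: cdk.App, id: string) {
    super(scope, id);
    this.bucket = new s3.Bucket(this, 'SharedBucket');
  }
}

// The consuming stack receives the Bucket object through its props
// and can call grant methods, read attributes, etc. on it.
interface ConsumerStackProps {
  bucket: s3.Bucket;
}

class ConsumerStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props: ConsumerStackProps) {
    super(scope, id);
    // e.g. props.bucket.grantRead(someRole);
  }
}

const app = new cdk.App();
const producer = new ProducerStack(app, 'Producer');
new ConsumerStack(app, 'Consumer', { bucket: producer.bucket });
```

Passing the construct object directly lets the CDK wire up any cross-stack references for you.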
To import an existing bucket into your CDK application, use the `Bucket.import` factory method. This method accepts a construct id and a `BucketImportProps` object which describes the properties of the already existing bucket:

```ts
const bucket = Bucket.import(this, 'MyImportedBucket', {
  bucketArn: 'arn:aws:s3:::my-bucket'
});

// now you can just call methods on the bucket
bucket.grantReadWrite(user);
```
The Amazon S3 notification feature enables you to receive notifications when certain events happen in your bucket, as described under S3 Bucket Notifications in the S3 Developer Guide.

To subscribe for bucket notifications, use the `bucket.onEvent` method. The `bucket.onObjectCreated` and `bucket.onObjectRemoved` methods can also be used for these common use cases.
The following example will subscribe an SNS topic to be notified of all `s3:ObjectCreated:*` events:

```ts
const myTopic = new sns.Topic(this, 'MyTopic');
bucket.onEvent(s3.EventType.ObjectCreated, myTopic);
```
This call will also ensure that the topic policy can accept notifications for this specific bucket.
The following destinations are currently supported:

- `sns.Topic`
- `sqs.Queue`
- `lambda.Function`
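Of the destinations listed, only the topic and queue are demonstrated in this document. A minimal sketch of the Lambda case, assuming the same `onObjectCreated` API shown above; the function name, runtime, and inline handler are illustrative, not from the original:

```ts
import lambda = require('@aws-cdk/aws-lambda');

// a hypothetical function to receive the notifications
const notifyFn = new lambda.Function(this, 'NotifyFn', {
  runtime: lambda.Runtime.NodeJS810,
  handler: 'index.handler',
  code: lambda.Code.inline('exports.handler = async () => {};'),
});

// invoke the function whenever an object is created in the bucket
bucket.onObjectCreated(notifyFn);
```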
It is also possible to specify S3 object key filters when subscribing. The following example will notify `myQueue` when objects with the `foo/` prefix and the `.jpg` suffix are removed from the bucket:

```ts
bucket.onEvent(s3.EventType.ObjectRemoved, myQueue, { prefix: 'foo/', suffix: '.jpg' });
```
Use `blockPublicAccess` to specify block public access settings on the bucket.

Enable all block public access settings:

```ts
const bucket = new Bucket(this, 'MyBlockedBucket', {
  blockPublicAccess: BlockPublicAccess.BlockAll
});
```

Block and ignore public ACLs:

```ts
const bucket = new Bucket(this, 'MyBlockedBucket', {
  blockPublicAccess: BlockPublicAccess.BlockAcls
});
```

Alternatively, specify the settings manually:

```ts
const bucket = new Bucket(this, 'MyBlockedBucket', {
  blockPublicAccess: new BlockPublicAccess({ blockPublicPolicy: true })
});
```
When `blockPublicPolicy` is set to `true`, `grantPublicRead()` throws an error.
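For contrast with the restriction above, here is a minimal sketch of the case where `grantPublicRead()` is allowed: a bucket with no `blockPublicAccess` settings. The bucket name is illustrative, and the call is assumed to follow the `grantPublicRead` signature referenced above.

```ts
// no blockPublicAccess configured, so a public bucket policy is permitted
const publicBucket = new Bucket(this, 'MyPublicBucket');

// grants s3:GetObject on the bucket's objects to everyone
publicBucket.grantPublicRead();
```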
## 0.26.0 (2019-03-20)

- `_toCloudFormation` (#2047) (515868b), closes #2044 #2016
- `Database` and `Table` (#1988) (3117cd3)
- BREAKING CHANGE: `ContainerImage.fromDockerHub` has been renamed to `ContainerImage.fromRegistry`.
## FAQs

The CDK Construct Library for AWS::S3.

The npm package @aws-cdk/aws-s3 receives a total of 99,022 weekly downloads. As such, @aws-cdk/aws-s3 popularity was classified as popular. We found that @aws-cdk/aws-s3 demonstrated an unhealthy version release cadence and project activity because the last version was released a year ago. It has 4 open source maintainers collaborating on the project.